20 research outputs found

    Controlling docking, altitude and speed in a circular high-roofed tunnel thanks to the optic flow

    BeeRotor, the new robot we have developed, is a tandem rotorcraft that mimics optic flow-based behaviors previously observed in flies and bees. This tethered miniature robot (80 g), autonomous in terms of its computational power requirements, is equipped with a 13.5-g quasi-panoramic visual system consisting of 4 individual visual motion sensors responding to the optic flow generated by photographs of natural scenes, thanks to the bio-inspired "time of travel" scheme. Based on recent findings on insects' sensing abilities and control strategies, the BeeRotor robot was designed to use optic flow to perform complex tasks such as ground and ceiling following while automatically adjusting its forward speed on the basis of the ventral or dorsal optic flow. In addition, the BeeRotor robot can perform tricky manoeuvres such as automatic ceiling docking simply by regulating its dorsal or ventral optic flow in a high-roofed tunnel lined with natural scenes. Although it was built as a proof of concept, the BeeRotor robot is one step further towards a fully autonomous micro-helicopter capable of navigating mainly on the basis of the optic flow.
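    The ventral (or dorsal) optic-flow regulation described above can be sketched as a simple feedback loop: hold the measured optic flow at a setpoint by adjusting lift, so that the robot's clearance from the followed surface becomes proportional to its speed. The gains, setpoint and first-order plant below are illustrative assumptions, not BeeRotor's actual parameters.

    ```python
    def ventral_optic_flow(forward_speed, altitude):
        """Translational optic flow seen looking straight down (rad/s)."""
        return forward_speed / altitude

    def of_regulator_step(altitude, forward_speed, of_setpoint=2.0,
                          gain=0.5, dt=0.01):
        """One proportional step of the OF regulator: when the measured OF
        exceeds the setpoint the robot is too close to the ground and
        climbs; when it is below the setpoint the robot descends."""
        error = of_setpoint - ventral_optic_flow(forward_speed, altitude)
        return altitude - gain * error * dt

    # At a fixed forward speed the loop settles where OF == setpoint,
    # i.e. altitude == forward_speed / of_setpoint (here 3.0 / 2.0 = 1.5 m).
    altitude = 1.0
    for _ in range(2000):
        altitude = of_regulator_step(altitude, forward_speed=3.0)
    ```

    The same loop applied to the dorsal optic flow yields ceiling following and, as the setpoint is approached ever more closely, ceiling docking.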

    The Vertical Optic Flow: An Additional Cue for Stabilizing Beerotor Robot’s Flight Without IMU

    Bio-inspired guidance principles involving no reference frame are presented here; they were implemented in a rotorcraft called BeeRotor, which was equipped with a minimalistic panoramic optic flow sensor and no accelerometer or inertial measurement unit (IMU) [9], as in flying insects (Diptera use only rotation rates). In the present paper, the vertical optic flow was used as an additional cue, whereas the previously published BeeRotor II visuo-motor system only used translational optic flow cues [9]. To test these guidance principles, we built a tethered tandem rotorcraft called BeeRotor (80 g), which flies along a high-roofed tunnel. The aerial robot adjusts its pitch and hence its speed, hugs the ground and lands safely without any need for an inertial reference frame. The rotorcraft's altitude and forward speed are adjusted via several optic flow feedback loops piloting the lift and the pitch angle on the basis of the common-mode and differential rotor speeds, respectively, as well as an active system of reorientation of a quasi-panoramic eye which constantly realigns its gaze, keeping it parallel to the nearest surface followed. Safe automatic terrain following and landing were obtained with the active eye-reorientation system over rugged terrain, without any need for an inertial reference frame.
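    The dual feedback structure described above can be sketched as follows for a tandem rotorcraft: an altitude loop drives the common-mode rotor speed (lift) from the ventral optic-flow error, while a speed loop drives the differential rotor speed (pitch). The gains, hover RPM and linear rotor mixing are illustrative assumptions, not BeeRotor's actual controller.

    ```python
    def mix_rotors(common_mode, differential):
        """Map common-mode and differential commands to the front and
        rear rotor speeds of a tandem rotorcraft."""
        return common_mode + differential, common_mode - differential

    def of_feedback_step(ventral_of, of_alt_setpoint,
                         forward_of, of_speed_setpoint,
                         k_lift=50.0, k_pitch=20.0, hover_rpm=2000.0):
        """Ventral OF above its setpoint -> robot too low -> raise the
        common-mode rotor speed (more lift). Forward OF below its
        setpoint -> positive differential, pitching forward to speed up."""
        common = hover_rpm + k_lift * (ventral_of - of_alt_setpoint)
        diff = k_pitch * (of_speed_setpoint - forward_of)
        return mix_rotors(common, diff)

    # At both setpoints the rotors sit at the hover speed, undisturbed.
    front, rear = of_feedback_step(2.0, 2.0, 1.0, 1.0)
    ```

    Because both loops use only optic flow and rotor speeds, no inertial reference frame appears anywhere in the control law, which is the point of the guidance principle.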

    A mouse sensor and a 2-pixel motion sensor exposed to continuous illuminance changes

    Considerable attention has been paid during the last decade to navigation systems based on visual optic flow cues, especially for guiding autonomous robots designed to travel under specific lighting conditions. In the present study, the performance of two visual motion sensors used to measure a local 1-D angular speed, namely (i) a bio-inspired 2-pixel motion sensor and (ii) an off-the-shelf mouse sensor, was tested for the first time over a wide range of illuminance levels. The sensors' characteristics were determined by recording their responses to a purely rotational optic flow, generated by rotating the sensors mechanically, and comparing these responses with an accurate rate gyro output signal. The refresh rate, a key parameter for future optic flow-based robotic applications, was also defined and tested for these two sensors. The bio-inspired 2-pixel motion sensor was found to be more accurate indoors, whereas the mouse sensor was found to be more efficient outdoors.
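    The characterization procedure described above can be sketched in a few lines: score the sensor's angular-speed readings against the rate-gyro reference, and count how often the sensor actually delivers a new measurement to get the refresh rate. The scoring function and the data below are illustrative assumptions, not the paper's actual test bench.

    ```python
    def characterize(sensor_readings, gyro_readings, duration_s):
        """sensor_readings: angular speeds (deg/s), with None where the
        sensor produced no new measurement; gyro_readings: reference
        angular speeds from the rate gyro at the same instants."""
        errors = [abs(s - g) for s, g in zip(sensor_readings, gyro_readings)
                  if s is not None]
        mean_abs_error = sum(errors) / len(errors)
        refresh_rate_hz = len(errors) / duration_s
        return mean_abs_error, refresh_rate_hz

    # Synthetic example: 4 sampling instants over 2 s, one missed reading.
    mae, refresh = characterize([10.1, None, 9.9, 10.0],
                                [10.0, 10.0, 10.0, 10.0], duration_s=2.0)
    ```

    Running the same scorer indoors and outdoors is enough to reproduce the kind of accuracy/refresh-rate comparison the study reports.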

    A two-directional 1-gram visual motion sensor inspired by the fly's eye

    Optic flow-based autopilots for Micro-Aerial Vehicles (MAVs) need lightweight, low-power sensors to be able to fly safely through unknown environments. The new tiny 6-pixel visual motion sensor presented here meets these demanding requirements in terms of mass, size and power consumption. This 1-gram, low-power, fly-inspired sensor accurately gauges visual motion using only its 6-pixel array, as tested with two different panoramas under varying illuminance conditions. The new visual motion sensor's output results from a smart combination of the information collected by several 2-pixel Local Motion Sensors (LMSs), based on the "time of travel" scheme originally inspired by the common housefly's Elementary Motion Detector (EMD) neurons. The proposed sensory fusion method enables the new visual sensor to measure the visual angular speed and determine the main direction of the visual motion without any prior knowledge. By computing the median value of the outputs from several LMSs, we also obtained a more robust, more accurate and more frequently refreshed measurement of the 1-D angular speed.
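    The median-based fusion described above can be sketched as follows: pool the signed outputs of several LMSs, take the median of their absolute values as the angular-speed estimate, and use a vote on their signs for the main motion direction. The median rule is the abstract's idea; the sign vote and the numbers are illustrative assumptions.

    ```python
    import statistics

    def fuse_lms(lms_outputs):
        """lms_outputs: signed angular speeds (deg/s) from individual
        2-pixel LMSs (None = no new measurement this cycle). Returns the
        median absolute speed and the majority sign as the direction."""
        valid = [v for v in lms_outputs if v is not None]
        speed = statistics.median(abs(v) for v in valid)
        positives = sum(1 for v in valid if v > 0)
        direction = 1 if positives >= len(valid) - positives else -1
        return speed, direction

    # One outlier with the wrong sign barely shifts the median estimate.
    speed, direction = fuse_lms([10.0, 12.0, -11.0, 9.0, None])
    ```

    The median is what buys robustness here: a single aberrant LMS output moves the mean but leaves the median essentially unchanged.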

    Insect inspired visual motion sensing and flying robots

    Flying insects are masters of visual motion sensing: they use dedicated motion processing circuits at low energy and computational cost. Building on observations of insect visual guidance, we developed visual motion sensors and bio-inspired autopilots dedicated to flying robots. Optic flow-based visuomotor control systems have been implemented on an increasingly large number of sighted autonomous robots. In this chapter, we present how we designed and constructed local motion sensors and how we implemented bio-inspired visual guidance schemes on board several micro-aerial vehicles. A hyperacute sensor, in which retinal micro-scanning movements are performed via a small piezo-bender actuator, was mounted onto a miniature aerial robot. The OSCAR II robot is able to track a moving target accurately by exploiting the micro-scanning movement imposed on its eye's retina. We also present two interdependent control schemes, driving the eye's angular position in the robot frame and the robot's body angular position with respect to a visual target, without any knowledge of the robot's orientation in the global frame. This "steering-by-gazing" control strategy, implemented on this lightweight (100 g) miniature sighted aerial robot, demonstrates the effectiveness of this biomimetic visual/inertial heading control strategy.
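    The two interdependent loops of the "steering-by-gazing" strategy can be sketched as nested proportional controllers: an inner loop turns the eye toward the target (it alone sees the gaze error), and an outer loop turns the body so as to re-center the eye in its orbit. The gains, Euler integration and scalar angles below are illustrative assumptions, not OSCAR II's actual controller.

    ```python
    def steering_by_gazing_step(eye, body, target_bearing,
                                k_eye=2.0, k_body=0.5, dt=0.01):
        """eye: eye angle in the body frame; body: body heading in the
        world frame; target_bearing: target direction in the world frame.
        The inner loop turns the eye toward the target; the outer loop
        turns the body to bring the eye back to its rest position."""
        gaze_error = target_bearing - (body + eye)
        eye += k_eye * gaze_error * dt
        body += k_body * eye * dt
        return eye, body

    # At steady state the body heads at the target and the eye is at rest,
    # even though the controller never used a global orientation estimate.
    eye, body = 0.0, 0.0
    for _ in range(5000):
        eye, body = steering_by_gazing_step(eye, body, target_bearing=0.2)
    ```

    Note that only the sum `body + eye` (the gaze direction) is ever compared with the target bearing, which is why no knowledge of the robot's orientation in the global frame is required on board.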

    Miniature curved artificial compound eyes.

    In most animal species, vision is mediated by compound eyes, which offer lower resolution than vertebrate single-lens eyes, but significantly larger fields of view with negligible distortion and spherical aberration, as well as high temporal resolution in a tiny package. Compound eyes are ideally suited for fast panoramic motion perception. Engineering a miniature artificial compound eye is challenging because it requires accurate alignment of photoreceptive and optical components on a curved surface. Here, we describe a unique design method for biomimetic compound eyes featuring a panoramic, undistorted field of view in a very thin package. The design consists of three planar layers of separately produced arrays, namely, a microlens array, a neuromorphic photodetector array, and a flexible printed circuit board, that are stacked, cut, and curved to produce a mechanically flexible imager. Following this method, we have prototyped and characterized an artificial compound eye bearing a hemispherical field of view with embedded and programmable low-power signal processing, high temporal resolution, and local adaptation to illumination. The prototyped artificial compound eye possesses several characteristics similar to the eye of the fruit fly Drosophila and other arthropod species. This design method opens up additional vistas for a broad range of applications in which wide-field motion detection is at a premium, such as collision-free navigation of terrestrial and aerospace vehicles, and for the experimental testing of insect vision theories.

    Flying over uneven moving terrain based on optic-flow cues without any need for reference frames or accelerometers

    Two bio-inspired guidance principles involving no reference frame are presented here; they were implemented in a rotorcraft equipped with panoramic optic flow (OF) sensors but (as in flying insects) no accelerometer. To test these two guidance principles, we built a tethered tandem rotorcraft called BeeRotor (80 grams), which was tested flying along a high-roofed tunnel. The aerial robot adjusts its pitch and hence its speed, hugs the ground and lands safely without any need for an inertial reference frame. The rotorcraft's altitude and forward speed are adjusted via two OF regulators piloting the lift and the pitch angle on the basis of the common-mode and differential rotor speeds, respectively. The robot, equipped with two wide-field OF sensors, was tested in order to assess the performance of the following two guidance systems involving no inertial reference frame: (i) a system with a fixed eye orientation based on the curved artificial compound eye (CurvACE) sensor, and (ii) an active reorientation system based on a quasi-panoramic eye which constantly realigns its gaze, keeping it parallel to the nearest surface followed. Safe automatic terrain following and landing were obtained with CurvACE under dim light to daylight conditions, and with the active eye-reorientation system over rugged, changing terrain, without any need for an inertial reference frame.

    Flying robot inspired by insects: From optic flow sensing to visually guided strategies to control a Micro Aerial Vehicle

    In this thesis, we first developed and characterized optic flow sensors robust to illuminance changes, inspired by the visual system of the fly and computing the angular speed thanks to the "time of travel" scheme. In particular, we compared the performance of sensors measuring the visual angular speed based on a standard retina or on an aVLSI retina composed of pixels automatically adapting to the background illuminance, in indoor and outdoor environments. The results of these bio-inspired sensors were also compared with those of optic mouse sensors, which are nowadays used on Micro Aerial Vehicles to measure the optic flow, but only in outdoor environments. Finally, a new implementation of the "time of travel" scheme was proposed, reducing the computational load of the processing unit. In the framework of the European project CurvACE (Curved Artificial Compound Eye), we also participated in the design and development of the first curved artificial compound eye, capable of fast motion detection over a very large range of ambient light levels. In particular, we characterized this sensor and showed its ability to measure optic flow using several algorithms. Finally, we developed a tethered miniature aerial robot called BeeRotor, equipped with sensors and control strategies mimicking those of flying insects, navigating autonomously in a contrasted, high-roofed tunnel. This robot may explain how honeybees control their speed and position thanks to optic flow, while demonstrating an alternative to classical robotic approaches relying on ground-truth and metric sensors. Based only on visuomotor control loops reacting suitably to the environment, this rotorcraft has shown its ability to fly autonomously in complex and moving environments.
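    The "time of travel" scheme named above can be sketched as follows: two adjacent photoreceptors separated by a known inter-receptor angle see the same moving contrast with a delay, and the angular speed is that angle divided by the measured delay. The threshold-crossing detection and the 4-degree inter-receptor angle below are illustrative assumptions.

    ```python
    def time_of_travel(signal_a, signal_b, dt, threshold,
                       inter_receptor_angle_deg=4.0):
        """signal_a/signal_b: sampled outputs of two photoreceptors,
        receptor A ahead of receptor B along the motion direction;
        dt: sample period (s). Returns the angular speed in deg/s, or
        None if no valid crossing pair is found."""
        def first_crossing(sig):
            for i in range(1, len(sig)):
                if sig[i - 1] < threshold <= sig[i]:
                    return i * dt
            return None
        t_a, t_b = first_crossing(signal_a), first_crossing(signal_b)
        if t_a is None or t_b is None or t_b <= t_a:
            return None
        return inter_receptor_angle_deg / (t_b - t_a)

    # A contrast crosses receptor A at t = 0.02 s and receptor B at
    # t = 0.04 s: 4 degrees travelled in 0.02 s.
    speed = time_of_travel([0, 0, 1, 1, 1, 1], [0, 0, 0, 0, 1, 1],
                           dt=0.01, threshold=0.5)
    ```

    Running the same scheme on the mirrored receptor pair gives the opposite motion direction, which is how a 2-pixel Local Motion Sensor can be made two-directional.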

    Visual motion sensing onboard a 50-g helicopter flying freely under complex VICON-lighting conditions

    In previous studies, we described how complicated tasks such as ground avoidance, terrain following, takeoff and landing can be performed using optic flow sensors mounted on a tethered flying robot called OCTAVE. In the present study, a new programmable visual motion sensor connected to a lightweight Bluetooth module was mounted on a free-flying 50-gram helicopter called TwinCoax. This small helicopter, equipped with 3 IR-reflective markers, was flown in a dedicated room equipped with a VICON system to record its trajectory. The results of this study show that despite the complex, adverse lighting conditions, the optic flow measured onboard closely matched the ground-truth optic flow derived from the free-flying helicopter's trajectory.
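    The ground-truth comparison described above can be sketched as follows: differentiate the motion-capture trajectory to obtain the forward speed, then divide by the height to get the translational optic flow a downward-looking sensor should report. The function name and synthetic data are ours; the formula is the standard optic-flow relation for level flight over a flat surface.

    ```python
    def ground_truth_optic_flow(xs, zs, dt):
        """xs: forward positions (m) from motion capture, zs: heights (m),
        dt: sample period (s). Returns the translational optic flow
        (rad/s) seen looking straight down at each interior sample,
        using a central difference for the forward speed."""
        return [((xs[i + 1] - xs[i - 1]) / (2.0 * dt)) / zs[i]
                for i in range(1, len(xs) - 1)]

    # Level flight at 1 m/s and 0.5 m height -> 2 rad/s of ventral OF.
    of = ground_truth_optic_flow([0.0, 0.1, 0.2, 0.3],
                                 [0.5, 0.5, 0.5, 0.5], dt=0.1)
    ```

    Plotting this reference series against the onboard sensor's output is the kind of comparison the study reports.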